test(openai-agents): Replace mocks with httpx types for streamed responses #5580

alexander-alderman-webb wants to merge 4 commits into master

Conversation
Codecov Results 📊

✅ 13 passed | Total: 13 | Pass Rate: 100% | Execution Time: 4.89s

All tests are passing. ✅ Patch coverage is 100.00%. Project has 13794 uncovered lines.

Generated by Codecov Action
Semver Impact of This PR

🟢 Patch (bug fixes)

📋 Changelog Preview
This is how your changes will appear in the changelog.

Bug Fixes 🐛
Documentation 📚
Internal Changes 🔧

🤖 This preview updates automatically when you update the PR.
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.
Async generator passed to wrong httpx.Response parameter
High Severity
The httpx.Response constructor's content parameter expects bytes, but the code passes an async generator (async_iterator(sse_chunks(...))) to it. For streaming content, the correct parameter is stream, which accepts an AsyncByteStream (async generators satisfy this via __aiter__). The openai-python test pattern that this PR claims to follow actually uses stream=MockStream(body), not content=. Passing a non-bytes object to content will cause the tests to fail: either immediately at construction time with a TypeError, or at read time when httpx tries to treat the async generator as bytes.
Description
Replace mocks with `openai` types to avoid test failures when library internals change. Move `async_iterator()` to `conftest.py`.

Issues
Reminders

- Run `tox -e linters`.
- Use a conventional commit prefix (`feat:`, `fix:`, `ref:`, `meta:`).
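The `async_iterator()` helper the description moves into `conftest.py` can be sketched like this. This is a hedged guess at its shape based on its name and usage, not the PR's actual code; `_collect` is only a local usage demo.

```python
import asyncio
from typing import AsyncIterator, Iterable, TypeVar

T = TypeVar("T")


async def async_iterator(items: Iterable[T]) -> AsyncIterator[T]:
    """Yield each item of a plain iterable as an async stream, e.g. to
    feed fake SSE chunks into code that expects an async iterator."""
    for item in items:
        yield item


async def _collect() -> list:
    # Usage demo: drain the async iterator back into a list.
    return [chunk async for chunk in async_iterator([b"a", b"b"])]


collected = asyncio.run(_collect())
```

Keeping the helper in `conftest.py` lets every test module in the package reuse it without imports between test files.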